LU Preconditioning for Overdetermined Sparse Least Squares Problems

Authors

  • Gary W. Howell
  • Marc Baboulin
Abstract

We investigate how to use an LU factorization with the classical lsqr routine for solving overdetermined sparse least squares problems. Usually L is much better conditioned than A, and iterating with L instead of A results in faster convergence. When a runtime test indicates that L is not sufficiently well-conditioned, a partial orthogonalization of L accelerates the convergence. Numerical experiments illustrate the good behavior of our algorithm in terms of storage and convergence.
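As a minimal sketch of this idea, assuming dense routines on a hypothetical random test matrix (the paper itself works with sparse factorizations), one can factor A = PLU with scipy.linalg.lu, run LSQR on the lower trapezoidal factor L, and recover the solution by back-substitution with U:

    # Minimal sketch (dense, illustrative): rectangular LU A = P L U,
    # LSQR on the better-conditioned factor L, back-substitution with U.
    import numpy as np
    from scipy.linalg import lu, solve_triangular
    from scipy.sparse.linalg import lsqr

    rng = np.random.default_rng(0)
    m, n = 500, 50                       # overdetermined: m > n
    A = rng.standard_normal((m, n))      # hypothetical test matrix
    b = rng.standard_normal(m)

    # A = P @ L @ U with L (m x n) lower trapezoidal and U (n x n)
    # upper triangular.
    P, L, U = lu(A)

    # min ||A x - b|| = min ||L y - P^T b|| with y = U x, so iterate on L.
    y = lsqr(L, P.T @ b)[0]
    x = solve_triangular(U, y)           # recover x from U x = y

    x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
    print(np.linalg.norm(x - x_ref))     # agreement with a direct solve

With partial pivoting the entries of L are bounded by one, which is why L is typically far better conditioned than A.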


Related articles

Preconditioned Iterative Methods for Solving Linear Least Squares Problems

New preconditioning strategies for solving large sparse overdetermined m × n linear least squares problems with the CGLS method are described. First, direct preconditioning of the normal equations by the Balanced Incomplete Factorization (BIF) for symmetric positive definite matrices is studied, and a new breakdown-free strategy is proposed. Preconditioning based on the incomplete LU factorization…
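SciPy ships no BIF routine, so the following sketch substitutes an incomplete LU (spilu) of the normal equations and solves them with conjugate gradients; it mirrors only the general pattern of preconditioning A^T A, not the BIF method itself:

    # Sketch of preconditioned normal equations; spilu stands in for BIF.
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import spilu, cg, LinearOperator

    rng = np.random.default_rng(1)
    m, n = 400, 100
    # Hypothetical sparse test matrix, padded with an identity block so
    # that it has full column rank.
    A = sp.random(m, n, density=0.05, random_state=rng) + sp.vstack(
        [sp.identity(n), sp.csr_matrix((m - n, n))])
    b = rng.standard_normal(m)

    AtA = (A.T @ A).tocsc()              # normal equations matrix (SPD)
    Atb = A.T @ b

    ilu = spilu(AtA, drop_tol=1e-4)      # incomplete factorization of A^T A
    M = LinearOperator(AtA.shape, matvec=ilu.solve)

    x, info = cg(AtA, Atb, M=M)          # preconditioned CG on A^T A x = A^T b
    print(info, np.linalg.norm(AtA @ x - Atb))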


Least squares solution of nearly square overdetermined sparse linear systems

The solution of nearly square overdetermined linear systems is studied. The sparse QR technique is compared with two sparse LU-based techniques. Numerical tests with real-world and artificial matrices indicate that the LU techniques are more accurate for incompatible right-hand sides. Moreover, the number of floating-point operations required by the LU techniques is roughly half that of the QR…
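A small dense and purely illustrative rendering of the two routes (sizes and the random matrix are assumptions) is:

    # Illustrative dense comparison: economy QR versus rectangular LU
    # followed by LSQR on L, as in the main paper above.
    import numpy as np
    from scipy.linalg import qr, lu, solve_triangular
    from scipy.sparse.linalg import lsqr

    rng = np.random.default_rng(2)
    m, n = 210, 200                      # nearly square: m slightly > n
    A = rng.standard_normal((m, n))
    b = rng.standard_normal(m)

    # QR route: A = Q R with Q (m x n), R (n x n).
    Q, R = qr(A, mode='economic')
    x_qr = solve_triangular(R, Q.T @ b)

    # LU route: A = P L U; iterate with L, back-substitute with U.
    P, L, U = lu(A)
    x_lu = solve_triangular(U, lsqr(L, P.T @ b)[0])

    print(np.linalg.norm(x_qr - x_lu))   # both approximate the same solution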


Preconditioning of linear least-squares problems by identifying basic variables

The preconditioning of linear least-squares problems is a hard task. The linear model underpinning least-squares problems, that is, the overdetermined matrix defining it, does not have the properties of differential problems that make standard preconditioners effective. Incomplete Cholesky techniques applied to the normal equations do not produce a well-conditioned problem. We attempt to remove …
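A hedged sketch of the basic-variables idea: select a nonsingular square block B of rows of A (here by pivoted QR of A^T, an assumption of this sketch rather than the paper's selection rule) and let LSQR iterate on the right-preconditioned operator A B^{-1}:

    # Sketch of basis ("basic variables") preconditioning: pick n rows of A
    # forming a nonsingular block B, then run LSQR on A @ inv(B).
    import numpy as np
    from scipy.linalg import qr, lu_factor, lu_solve
    from scipy.sparse.linalg import lsqr, LinearOperator

    rng = np.random.default_rng(3)
    m, n = 300, 40
    A = rng.standard_normal((m, n))
    b = rng.standard_normal(m)

    _, _, piv = qr(A.T, pivoting=True)   # rank rows of A by pivot order
    B_lu = lu_factor(A[piv[:n], :])      # factor the square basis block B

    # Right-preconditioned operator M = A B^{-1}; LSQR needs both products.
    M = LinearOperator((m, n),
                       matvec=lambda v: A @ lu_solve(B_lu, v),
                       rmatvec=lambda u: lu_solve(B_lu, A.T @ u, trans=1))

    y = lsqr(M, b)[0]                    # solve min ||A B^{-1} y - b||
    x = lu_solve(B_lu, y)                # recover x = B^{-1} y
    print(np.linalg.norm(A.T @ (A @ x - b)))   # normal-equations residual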


Iterative Scaled Trust-Region Learning in Krylov Subspaces via Pearlmutter's Implicit Sparse Hessian-Vector Multiply

The online incremental gradient (or backpropagation) algorithm is widely considered to be the fastest method for solving large-scale neural-network (NN) learning problems. In contrast, we show that an appropriately implemented iterative batch-mode (or block-mode) learning method can be much faster. For example, it is three times faster in the UCI letter classification problem (26 outputs, 16,00...
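The enabling ingredient is a Hessian-vector product that never forms the Hessian. The sketch below shows the matrix-free pattern for a sparse linear least squares loss, where Hv = X^T(Xv) plus a small damping term; the model and the plain CG inner solve are simplifying assumptions of this sketch, not Pearlmutter's exact R-operator for a neural network:

    # Matrix-free Hessian-vector products for a sparse linear least squares
    # loss: H v = X^T (X v) + damping, with no explicit Hessian.
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import cg, LinearOperator

    rng = np.random.default_rng(4)
    n_samples, n_features = 2000, 200
    X = sp.random(n_samples, n_features, density=0.02, random_state=rng)
    y = rng.standard_normal(n_samples)

    w = np.zeros(n_features)
    grad = X.T @ (X @ w - y)             # gradient of 0.5 * ||X w - y||^2

    # Implicit Hessian-vector multiply with small damping (trust-region
    # spirit); X^T X is never formed.
    H = LinearOperator((n_features, n_features),
                       matvec=lambda v: X.T @ (X @ v) + 1e-6 * v)

    p, info = cg(H, -grad)               # one batch-mode Newton/Krylov step
    w = w + p
    print(info, np.linalg.norm(X.T @ (X @ w - y)))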


Approximate Generalized Inverse Preconditioning Methods for Least Squares Problems

…iterative methods to solve least squares problems more efficiently. We focused especially on one kind of preconditioner, in which the preconditioner is an approximate generalized inverse of the coefficient matrix of the least squares problem. We proposed two different approaches for constructing the approximate generalized inverses of the coefficient matrices: one is based on the Minim…
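As a hedged illustration of the pattern: with M ≈ A^+, LSQR iterates on the right-preconditioned operator AM, and the solution is recovered as x = Mz. Here M is the exact pseudoinverse of a small dense matrix, standing in for the sparse approximate inverses constructed in the thesis:

    # Sketch of right preconditioning with an approximate generalized
    # inverse M ~ A^+; np.linalg.pinv is an illustrative stand-in.
    import numpy as np
    from scipy.sparse.linalg import lsqr, LinearOperator

    rng = np.random.default_rng(5)
    m, n = 120, 30
    A = rng.standard_normal((m, n))
    b = rng.standard_normal(m)

    M = np.linalg.pinv(A)                # stand-in for an approximate A^+

    AM = LinearOperator((m, m),
                        matvec=lambda v: A @ (M @ v),
                        rmatvec=lambda u: M.T @ (A.T @ u))

    z = lsqr(AM, b)[0]                   # iterate on the preconditioned A M
    x = M @ z                            # recover x = M z
    print(np.linalg.norm(A.T @ (A @ x - b)))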



Publication date: 2015